-
There is overwhelming research evidence showing that students often struggle with learning key engineering concepts. The Low-Cost Desktop Learning Modules (LCDLMs) are model prototypes of standard industry equipment designed to help students learn fundamental but abstract engineering concepts in the classroom. Previous results have shown that students who interact with LCDLMs tend to outperform those who engage in traditional lectures. However, little is known about student profiles and their forms of engagement with this tool. Hence, the present study investigates the different student profiles that emerge from students working with the LCDLM and the demographic factors that influence student engagement with the tool. Participants (N = 1,288) responded to an engagement survey after working with LCDLMs in engineering classrooms in several states around the United States. We then used latent profile analysis (LPA), an advanced statistical approach, to better understand the learner engagement profiles resulting from students' self-reported engagement beliefs as they reflected on their experience using LCDLMs. The LPA revealed five distinct profile types: disengaged, somewhat engaged, moderately engaged, highly engaged, and fluctuating engagement. Results showed that those who interacted more actively with the LCDLM scored higher on the questionnaire than those who engaged with it passively. We conclude with a discussion of the theoretical and practical implications of our findings.
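A minimal sketch of how a latent profile analysis of this kind of engagement survey might be run, assuming Likert-style item scores and a Gaussian-mixture formulation of LPA; the simulated data, item count, and candidate profile numbers below are illustrative assumptions and are not taken from the study.

```python
# Hypothetical sketch: latent profile analysis operationalized as a
# Gaussian mixture model with BIC-based model selection. Illustration only;
# the study's actual survey items and LPA tooling are not reproduced here.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Simulated Likert-style engagement scores (1-5), 1,288 respondents x 10 items.
X = rng.integers(1, 6, size=(1288, 10)).astype(float)

# Fit candidate models with 1-6 latent profiles and keep the lowest-BIC fit,
# mirroring the usual model-selection step in a latent profile analysis.
models = [GaussianMixture(n_components=k, random_state=0).fit(X) for k in range(1, 7)]
best = min(models, key=lambda m: m.bic(X))

profiles = best.predict(X)  # profile assignment for each respondent
print(best.n_components, np.bincount(profiles))
```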
-
Understanding generalization and robustness of machine learning models fundamentally relies on assuming an appropriate metric on the data space. Identifying such a metric is particularly challenging for non-Euclidean data such as graphs. Here, we propose a pseudometric for attributed graphs, the Tree Mover's Distance (TMD), and study its relation to generalization. Via a hierarchical optimal transport problem, TMD reflects the local distribution of node attributes as well as the distribution of local computation trees, which are known to be decisive for the learning behavior of graph neural networks (GNNs). First, we show that TMD captures properties relevant to graph classification: a simple TMD-SVM performs competitively with standard GNNs. Second, we relate TMD to the generalization of GNNs under distribution shifts, and show that it correlates well with the performance drop under such shifts.
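The full Tree Mover's Distance is a hierarchical optimal transport over depth-L computation trees with a blank-node augmentation scheme; the sketch below only conveys the flavour of comparing attributed graphs via optimal transport, matching node features together with a one-hop neighbourhood summary using the POT library. The function names and the signature construction are assumptions for illustration, not the paper's definition.

```python
# Toy graph comparison in the spirit of optimal-transport graph distances.
# NOT the full Tree Mover's Distance: TMD recursively compares computation
# trees, whereas this sketch matches node features plus a one-hop mean.
import numpy as np
import ot  # POT: Python Optimal Transport


def graph_signature(X, A):
    """Concatenate each node's features with the mean of its neighbours' features."""
    deg = A.sum(axis=1, keepdims=True).clip(min=1)
    return np.hstack([X, (A @ X) / deg])


def graph_ot_distance(X1, A1, X2, A2):
    """Exact OT cost between the two graphs' node signatures (uniform node weights)."""
    S1, S2 = graph_signature(X1, A1), graph_signature(X2, A2)
    C = ot.dist(S1, S2, metric="euclidean")
    a = np.full(len(S1), 1.0 / len(S1))
    b = np.full(len(S2), 1.0 / len(S2))
    return ot.emd2(a, b, C)


# Toy usage: two random attributed graphs with 5 and 6 nodes.
rng = np.random.default_rng(0)
X1, X2 = rng.normal(size=(5, 3)), rng.normal(size=(6, 3))
A1 = np.triu((rng.random((5, 5)) < 0.4).astype(float), 1); A1 += A1.T
A2 = np.triu((rng.random((6, 6)) < 0.4).astype(float), 1); A2 += A2.T
print(graph_ot_distance(X1, A1, X2, A2))
```

Pairwise distances from such a pseudometric can be turned into a precomputed kernel, e.g. exp(-gamma * D), and fed to an SVM, which is the spirit of the TMD-SVM baseline mentioned in the abstract.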
-
Understanding the generalization of deep neural networks is one of the most important tasks in deep learning. Although much progress has been made, theoretical error bounds still often behave disparately from empirical observations. In this work, we develop margin-based generalization bounds, where the margins are normalized with optimal transport costs between independent random subsets sampled from the training distribution. In particular, the optimal transport cost can be interpreted as a generalization of variance which captures the structural properties of the learned feature space. Our bounds robustly predict the generalization error, given training data and network parameters, on large-scale datasets. Theoretically, we demonstrate that the concentration and separation of features play crucial roles in generalization, supporting empirical results in the literature. The code is available at https://github.com/chingyaoc/kV-Margin.
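A hedged sketch of the normalizing quantity described above: the optimal transport cost between two independent random subsets drawn from the training features. This illustrates the quantity only and does not reproduce the kV-Margin code linked in the abstract; the function name, subset size, and use of random features as a stand-in for learned representations are assumptions.

```python
# Illustrative computation of a Wasserstein-1 cost between two disjoint
# random subsets of a feature matrix, the kind of OT-based normalizer
# described in the abstract. Not the authors' kV-Margin implementation.
import numpy as np
import ot  # POT: Python Optimal Transport


def subset_ot_cost(features, k, seed=0):
    """W1 cost between two disjoint size-k subsets of the feature matrix."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(features))
    S1, S2 = features[idx[:k]], features[idx[k:2 * k]]
    C = ot.dist(S1, S2, metric="euclidean")
    w = np.full(k, 1.0 / k)
    return ot.emd2(w, w, C)


# Example: in practice, `features` might be penultimate-layer activations of a
# trained network; random Gaussians are used here purely for illustration.
feats = np.random.default_rng(1).normal(size=(2000, 128))
print(subset_ot_cost(feats, k=500))
```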